Console Output

Training and evaluating model for: Dryer
Dataset length: 22402 windows
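The 22,402 windows are presumably produced by sliding a fixed-length window over the recorded signal; the log does not show the window length or stride. A minimal sketch of that slicing step, with both parameters hypothetical:

```python
def make_windows(series, window, stride=1):
    """Slice a time series into (possibly overlapping) fixed-length windows.

    `series` is a sequence of per-timestep samples (scalars or feature
    vectors). Yields (len(series) - window) // stride + 1 windows.
    Both `window` and `stride` are illustrative -- the actual values
    used for this run are not shown in the log.
    """
    return [series[i:i + window]
            for i in range(0, len(series) - window + 1, stride)]
```

For example, a 10-sample series with `window=4, stride=2` yields 4 windows.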


NILMModel(
  (conv1d): Conv1d(9, 9, kernel_size=(3,), stride=(1,), padding=(1,))
  (lstm): LSTM(9, 256, num_layers=5, batch_first=True, dropout=0.1)
  (dropout): Dropout(p=0.1, inplace=False)
  (relu): ReLU()
  (output_layer): Linear(in_features=256, out_features=1, bias=True)
)
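The printed module summary pins down the layer shapes, but not how they are wired together. A minimal PyTorch sketch that reproduces this summary; the forward order (conv → ReLU → LSTM → dropout → linear head on the last timestep) is an assumption, since the log only shows the module definitions:

```python
import torch
import torch.nn as nn

class NILMModel(nn.Module):
    def __init__(self, in_channels=9, hidden=256, layers=5, p_drop=0.1):
        super().__init__()
        # Layer shapes taken directly from the printed summary above.
        self.conv1d = nn.Conv1d(in_channels, in_channels,
                                kernel_size=3, stride=1, padding=1)
        self.lstm = nn.LSTM(in_channels, hidden, num_layers=layers,
                            batch_first=True, dropout=p_drop)
        self.dropout = nn.Dropout(p=p_drop)
        self.relu = nn.ReLU()
        self.output_layer = nn.Linear(hidden, 1)

    def forward(self, x):
        # x: (batch, channels, window). This ordering of operations is
        # an assumption -- the log does not show the forward pass.
        x = self.relu(self.conv1d(x))
        x = x.permute(0, 2, 1)          # (batch, window, channels) for batch_first LSTM
        out, _ = self.lstm(x)
        out = self.dropout(out[:, -1, :])  # last timestep only
        return self.output_layer(out)      # (batch, 1) predicted appliance power
```

With padding=1 and kernel_size=3 the Conv1d preserves the window length, so any window size works.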
Epoch [1/300], Train Loss: 0.010861
Validation Loss: 0.011411
Epoch [2/300], Train Loss: 0.006275
Validation Loss: 0.002322
Epoch [3/300], Train Loss: 0.002497
Validation Loss: 0.002105
Epoch [4/300], Train Loss: 0.001883
Validation Loss: 0.001701
Epoch [5/300], Train Loss: 0.001242
Validation Loss: 0.001174
Epoch [6/300], Train Loss: 0.001044
Validation Loss: 0.001503
Epoch [7/300], Train Loss: 0.001451
Validation Loss: 0.000735
Epoch [8/300], Train Loss: 0.000843
Validation Loss: 0.000614
Epoch [9/300], Train Loss: 0.000750
Validation Loss: 0.000810
Epoch [10/300], Train Loss: 0.000760
Validation Loss: 0.000660
Epoch [11/300], Train Loss: 0.000651
Validation Loss: 0.000575
Epoch [12/300], Train Loss: 0.000588
Validation Loss: 0.000520
Epoch [13/300], Train Loss: 0.000595
Validation Loss: 0.000484
Epoch [14/300], Train Loss: 0.000567
Validation Loss: 0.000492
Epoch [15/300], Train Loss: 0.000568
Validation Loss: 0.000533
Epoch [16/300], Train Loss: 0.000507
Validation Loss: 0.000438
Epoch [17/300], Train Loss: 0.000557
Validation Loss: 0.000460
Epoch [18/300], Train Loss: 0.000461
Validation Loss: 0.000400
Epoch [19/300], Train Loss: 0.000480
Validation Loss: 0.000395
Epoch [20/300], Train Loss: 0.000451
Validation Loss: 0.000405
Epoch [21/300], Train Loss: 0.000427
Validation Loss: 0.000374
Epoch [22/300], Train Loss: 0.000425
Validation Loss: 0.000464
Epoch [23/300], Train Loss: 0.000450
Validation Loss: 0.000370
Epoch [24/300], Train Loss: 0.000419
Validation Loss: 0.000358
Epoch [25/300], Train Loss: 0.000422
Validation Loss: 0.000386
Epoch [26/300], Train Loss: 0.000391
Validation Loss: 0.000366
Epoch [27/300], Train Loss: 0.000381
Validation Loss: 0.000415
Epoch [28/300], Train Loss: 0.000392
Validation Loss: 0.000345
Epoch [29/300], Train Loss: 0.000381
Validation Loss: 0.000413
Epoch [30/300], Train Loss: 0.000389
Validation Loss: 0.000329
Epoch [31/300], Train Loss: 0.000348
Validation Loss: 0.000296
Epoch [32/300], Train Loss: 0.000331
Validation Loss: 0.000297
Epoch [33/300], Train Loss: 0.000355
Validation Loss: 0.000286
Epoch [34/300], Train Loss: 0.000350
Validation Loss: 0.000272
Epoch [35/300], Train Loss: 0.000319
Validation Loss: 0.000289
Epoch [36/300], Train Loss: 0.000312
Validation Loss: 0.000271
Epoch [37/300], Train Loss: 0.000309
Validation Loss: 0.000351
Epoch [38/300], Train Loss: 0.000290
Validation Loss: 0.000264
Epoch [39/300], Train Loss: 0.000377
Validation Loss: 0.000406
Epoch [40/300], Train Loss: 0.000332
Validation Loss: 0.000254
Epoch [41/300], Train Loss: 0.000290
Validation Loss: 0.000245
Epoch [42/300], Train Loss: 0.000256
Validation Loss: 0.000229
Epoch [43/300], Train Loss: 0.000247
Validation Loss: 0.000230
Epoch [44/300], Train Loss: 0.000244
Validation Loss: 0.000239
Epoch [45/300], Train Loss: 0.000245
Validation Loss: 0.000215
Epoch [46/300], Train Loss: 0.000228
Validation Loss: 0.000213
Epoch [47/300], Train Loss: 0.000229
Validation Loss: 0.000203
Epoch [48/300], Train Loss: 0.000211
Validation Loss: 0.000194
Epoch [49/300], Train Loss: 0.000212
Validation Loss: 0.000199
Epoch [50/300], Train Loss: 0.000211
Validation Loss: 0.000201
Epoch [51/300], Train Loss: 0.000207
Validation Loss: 0.000191
Epoch [52/300], Train Loss: 0.000250
Validation Loss: 0.000183
Epoch [53/300], Train Loss: 0.000199
Validation Loss: 0.000179
Epoch [54/300], Train Loss: 0.000202
Validation Loss: 0.000172
Epoch [55/300], Train Loss: 0.000179
Validation Loss: 0.000161
Epoch [56/300], Train Loss: 0.000185
Validation Loss: 0.000171
Epoch [57/300], Train Loss: 0.000207
Validation Loss: 0.000162
Epoch [58/300], Train Loss: 0.000172
Validation Loss: 0.000160
Epoch [59/300], Train Loss: 0.000177
Validation Loss: 0.000164
Epoch [60/300], Train Loss: 0.000164
Validation Loss: 0.000160
Epoch [61/300], Train Loss: 0.000158
Validation Loss: 0.000152
Epoch [62/300], Train Loss: 0.000169
Validation Loss: 0.000150
Epoch [63/300], Train Loss: 0.000184
Validation Loss: 0.000142
Epoch [64/300], Train Loss: 0.000149
Validation Loss: 0.000165
Epoch [65/300], Train Loss: 0.000162
Validation Loss: 0.000140
Epoch [66/300], Train Loss: 0.000147
Validation Loss: 0.000136
Epoch [67/300], Train Loss: 0.000142
Validation Loss: 0.000132
Epoch [68/300], Train Loss: 0.000149
Validation Loss: 0.000132
Epoch [69/300], Train Loss: 0.000132
Validation Loss: 0.000161
Epoch [70/300], Train Loss: 0.000139
Validation Loss: 0.000129
Epoch [71/300], Train Loss: 0.000131
Validation Loss: 0.000120
Epoch [72/300], Train Loss: 0.000122
Validation Loss: 0.000122
Epoch [73/300], Train Loss: 0.000129
Validation Loss: 0.000152
Epoch [74/300], Train Loss: 0.000177
Validation Loss: 0.000181
Epoch [75/300], Train Loss: 0.000136
Validation Loss: 0.000118
Epoch [76/300], Train Loss: 0.000117
Validation Loss: 0.000120
Epoch [77/300], Train Loss: 0.000116
Validation Loss: 0.000100
Epoch [78/300], Train Loss: 0.000113
Validation Loss: 0.000105
Epoch [79/300], Train Loss: 0.000116
Validation Loss: 0.000115
Epoch [80/300], Train Loss: 0.000170
Validation Loss: 0.000153
Epoch [81/300], Train Loss: 0.000128
Validation Loss: 0.000113
Epoch [82/300], Train Loss: 0.000110
Validation Loss: 0.000094
Epoch [83/300], Train Loss: 0.000103
Validation Loss: 0.000095
Epoch [84/300], Train Loss: 0.000102
Validation Loss: 0.000089
Epoch [85/300], Train Loss: 0.000097
Validation Loss: 0.000086
Epoch [86/300], Train Loss: 0.000103
Validation Loss: 0.000090
Epoch [87/300], Train Loss: 0.000097
Validation Loss: 0.000088
Epoch [88/300], Train Loss: 0.000096
Validation Loss: 0.000089
Epoch [89/300], Train Loss: 0.000107
Validation Loss: 0.000164
Epoch [90/300], Train Loss: 0.000114
Validation Loss: 0.000094
Epoch [91/300], Train Loss: 0.000095
Validation Loss: 0.000098
Epoch [92/300], Train Loss: 0.000090
Validation Loss: 0.000079
Epoch [93/300], Train Loss: 0.000086
Validation Loss: 0.000089
Epoch [94/300], Train Loss: 0.000086
Validation Loss: 0.000076
Epoch [95/300], Train Loss: 0.000084
Validation Loss: 0.000080
Epoch [96/300], Train Loss: 0.000091
Validation Loss: 0.000089
Epoch [97/300], Train Loss: 0.000099
Validation Loss: 0.000082
Epoch [98/300], Train Loss: 0.000097
Validation Loss: 0.000077
Epoch [99/300], Train Loss: 0.000085
Validation Loss: 0.000086
Epoch [100/300], Train Loss: 0.000078
Validation Loss: 0.000086
Epoch [101/300], Train Loss: 0.000083
Validation Loss: 0.000090
Epoch [102/300], Train Loss: 0.000087
Validation Loss: 0.000071
Epoch [103/300], Train Loss: 0.000122
Validation Loss: 0.000151
Epoch [104/300], Train Loss: 0.000116
Validation Loss: 0.000096
Epoch [105/300], Train Loss: 0.000090
Validation Loss: 0.000084
Epoch [106/300], Train Loss: 0.000094
Validation Loss: 0.000078
Epoch [107/300], Train Loss: 0.000077
Validation Loss: 0.000077
Epoch [108/300], Train Loss: 0.000073
Validation Loss: 0.000079
Epoch [109/300], Train Loss: 0.000075
Validation Loss: 0.000068
Epoch [110/300], Train Loss: 0.000072
Validation Loss: 0.000066
Epoch [111/300], Train Loss: 0.000070
Validation Loss: 0.000069
Epoch [112/300], Train Loss: 0.000070
Validation Loss: 0.000066
Epoch [113/300], Train Loss: 0.000071
Validation Loss: 0.000068
Epoch [114/300], Train Loss: 0.000075
Validation Loss: 0.000067
Epoch [115/300], Train Loss: 0.000069
Validation Loss: 0.000073
Epoch [116/300], Train Loss: 0.000072
Validation Loss: 0.000070
Epoch [117/300], Train Loss: 0.000068
Validation Loss: 0.000066
Epoch [118/300], Train Loss: 0.000065
Validation Loss: 0.000074
Epoch [119/300], Train Loss: 0.000067
Validation Loss: 0.000064
Epoch [120/300], Train Loss: 0.000066
Validation Loss: 0.000064
Epoch [121/300], Train Loss: 0.000063
Validation Loss: 0.000067
Epoch [122/300], Train Loss: 0.000074
Validation Loss: 0.000071
Epoch [123/300], Train Loss: 0.000070
Validation Loss: 0.000072
Epoch [124/300], Train Loss: 0.000063
Validation Loss: 0.000062
Epoch [125/300], Train Loss: 0.000063
Validation Loss: 0.000060
Epoch [126/300], Train Loss: 0.000061
Validation Loss: 0.000057
Epoch [127/300], Train Loss: 0.000062
Validation Loss: 0.000061
Epoch [128/300], Train Loss: 0.000060
Validation Loss: 0.000060
Epoch [129/300], Train Loss: 0.000062
Validation Loss: 0.000070
Epoch [130/300], Train Loss: 0.000059
Validation Loss: 0.000066
Epoch [131/300], Train Loss: 0.000061
Validation Loss: 0.000067
Epoch [132/300], Train Loss: 0.000057
Validation Loss: 0.000061
Epoch [133/300], Train Loss: 0.000057
Validation Loss: 0.000060
Epoch [134/300], Train Loss: 0.000056
Validation Loss: 0.000072
Epoch [135/300], Train Loss: 0.000059
Validation Loss: 0.000059
Epoch [136/300], Train Loss: 0.000056
Validation Loss: 0.000057
Early stopping triggered
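Training halts at epoch 136 of 300 because validation loss stopped improving. The patience and improvement threshold used are not shown in the log; a generic patience-based helper consistent with this behaviour might look like this (the class name and parameters are illustrative):

```python
class EarlyStopping:
    """Stop training once validation loss has not improved by at least
    `min_delta` for `patience` consecutive epochs. The actual patience
    and threshold used in this run are not shown in the log."""

    def __init__(self, patience=10, min_delta=0.0):
        self.patience = patience
        self.min_delta = min_delta
        self.best = float("inf")
        self.bad_epochs = 0

    def step(self, val_loss):
        """Record one epoch's validation loss; return True when training should stop."""
        if val_loss < self.best - self.min_delta:
            self.best = val_loss
            self.bad_epochs = 0
        else:
            self.bad_epochs += 1
        return self.bad_epochs >= self.patience
```

In the training loop one would typically also checkpoint the model whenever `val_loss` sets a new best, so the weights evaluated below correspond to the best epoch rather than the last one.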

Evaluating model for: Dryer
Validation MAE: 0.635975 W
Validation MSE: 57.174500 W²
Validation RMSE: 7.561382 W
Signal Aggregate Error (SAE): 0.001089
Normalized Disaggregation Error (NDE): 0.068977
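Assuming the usual NILM definitions — SAE = |Σŷ − Σy| / Σy and NDE = Σ(ŷ − y)² / Σy² — the five reported numbers can be computed from the true and predicted appliance power as follows (function name is illustrative):

```python
import math

def nilm_metrics(y_true, y_pred):
    """Pointwise errors (MAE, MSE, RMSE) plus the aggregate NILM metrics.

    SAE measures total-energy error: |sum(pred) - sum(true)| / sum(true).
    NDE normalizes squared error by the signal energy:
    sum((pred - true)^2) / sum(true^2).
    These definitions are the standard ones; the exact formulas used in
    the run above are not shown in the log.
    """
    n = len(y_true)
    errs = [p - t for t, p in zip(y_true, y_pred)]
    mae = sum(abs(e) for e in errs) / n
    mse = sum(e * e for e in errs) / n
    rmse = math.sqrt(mse)
    sae = abs(sum(y_pred) - sum(y_true)) / sum(y_true)
    nde = sum(e * e for e in errs) / sum(t * t for t in y_true)
    return {"MAE": mae, "MSE": mse, "RMSE": rmse, "SAE": sae, "NDE": nde}
```

Note that a small SAE (0.001089 here) only says total energy is matched; the per-timestep RMSE and NDE are the stricter indicators of disaggregation quality.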

      

[Plot: Training and Validation Loss per epoch — interactive training-loss plot not reproduced in this log]